Probabilistic Recurrent State-Space Models

Authors

  • Andreas Doerr
  • Christian Daniel
  • Martin Schiegg
  • Duy Nguyen-Tuong
  • Stefan Schaal
  • Marc Toussaint
  • Sebastian Trimpe
Abstract

State-space models (SSMs) are a highly expressive model class for learning patterns in time-series data and for system identification. Deterministic versions of SSMs (e.g., LSTMs) have proved extremely successful in modeling complex time-series data. Fully probabilistic SSMs, however, often prove hard to train, even for smaller problems. To overcome this limitation, we propose a scalable initialization and training algorithm based on doubly stochastic variational inference and Gaussian processes. In contrast to related approaches, the variational approximation we propose fully captures the temporal correlations of the latent states, which allows for robust training.
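To make the recipe in the abstract concrete, the following is a minimal, illustrative sketch of the general structure it describes: unroll a stochastic latent-state transition, score the observations, and optimize a Monte-Carlo ELBO with reparameterized samples. This is not the authors' implementation; the paper uses a sparse Gaussian process transition model, which is replaced here by a small neural network, and all dimensions and hyperparameters are assumptions.

# Illustrative sketch only (not the authors' code): a toy probabilistic
# state-space model trained with a stochastic variational (ELBO-style) objective.
# A small neural network stands in for the sparse GP transition used in the paper.
import math
import torch
import torch.nn as nn

class ToySSM(nn.Module):
    def __init__(self, state_dim=2, obs_dim=1, hidden=32):
        super().__init__()
        # Mean and log-variance of the transition distribution p(x_{t+1} | x_t, u_t)
        self.trans = nn.Sequential(nn.Linear(state_dim + 1, hidden), nn.Tanh(),
                                   nn.Linear(hidden, 2 * state_dim))
        # Linear-Gaussian observation model p(y_t | x_t)
        self.emit = nn.Linear(state_dim, obs_dim)
        self.obs_log_var = nn.Parameter(torch.zeros(obs_dim))
        # Variational distribution over the initial latent state q(x_0)
        self.q0_mean = nn.Parameter(torch.zeros(state_dim))
        self.q0_log_var = nn.Parameter(torch.zeros(state_dim))

    def elbo(self, y, u, n_samples=8):
        """Monte-Carlo ELBO: sample latent trajectories by unrolling the
        stochastic transition, score observations, regularize the initial state."""
        T = y.shape[0]
        log2pi = math.log(2.0 * math.pi)
        # Reparameterized samples of x_0 ~ q(x_0)
        std0 = (0.5 * self.q0_log_var).exp()
        x = self.q0_mean + std0 * torch.randn(n_samples, self.q0_mean.numel())
        log_lik = 0.0
        for t in range(T):
            # Observation likelihood log p(y_t | x_t) under a Gaussian emission
            mean_y = self.emit(x)
            var_y = self.obs_log_var.exp()
            log_lik = log_lik + (-0.5 * ((y[t] - mean_y) ** 2 / var_y
                                         + var_y.log() + log2pi)).sum(-1).mean()
            # Stochastic transition: sample x_{t+1} | x_t, u_t (reparameterized)
            inp = torch.cat([x, u[t].expand(n_samples, 1)], dim=-1)
            mean_x, log_var_x = self.trans(inp).chunk(2, dim=-1)
            x = mean_x + (0.5 * log_var_x).exp() * torch.randn_like(mean_x)
        # KL(q(x_0) || N(0, I)) in closed form
        kl0 = 0.5 * (self.q0_log_var.exp() + self.q0_mean ** 2
                     - 1.0 - self.q0_log_var).sum()
        return log_lik - kl0

# Tiny usage example on synthetic data (one scalar output, one control input).
model = ToySSM()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
y = torch.sin(torch.linspace(0, 6, 50)).unsqueeze(-1)   # observations, shape (T, 1)
u = torch.zeros(50, 1)                                   # control inputs, shape (T, 1)
for step in range(100):
    opt.zero_grad()
    loss = -model.elbo(y, u)
    loss.backward()
    opt.step()

Because latent trajectories are obtained by sampling through the unrolled transition, the temporal correlations between successive latent states are retained in the Monte-Carlo estimate, which is the property the abstract emphasizes.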


Similar articles

Structured Inference Networks for Nonlinear State Space Models

Gaussian state space models have been used for decades as generative models of sequential data. They admit an intuitive probabilistic interpretation, have a simple functional form, and enjoy widespread adoption. We introduce a unified algorithm to efficiently learn a broad class of linear and non-linear state space models, including variants where the emission and transition distributions are m...


Advanced Probabilistic Models for Speech and Language

This project proposal applies to the statistical and machine learning models used in speech and language research. It aims to develop a more general theoretical framework for the use of state-space models in speech and language, and to expand the set of models currently used in these fields. Probabilistic Models State-Space Models Generalized SSMs Nonlinear SSMs Bayesian Learning Inferring Traj...


Predictive-State Decoders: Encoding the Future into Recurrent Networks

Recurrent neural networks (RNNs) are a vital modeling technique that rely on internal states learned indirectly by optimization of a supervised, unsupervised, or reinforcement training loss. RNNs are used to model dynamic processes that are characterized by underlying latent states whose form is often unknown, precluding its analytic representation inside an RNN. In the Predictive-State Represe...
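As a rough illustration of the idea summarized above, the snippet below adds an auxiliary decoder that is trained to predict a short window of future observations from the RNN's hidden state, alongside the main task loss. This is a hedged sketch of the general mechanism only, not the paper's method; the GRU, the prediction horizon, and the loss weighting are assumptions.

# Sketch of a predictive-state style auxiliary objective (assumed details).
import torch
import torch.nn as nn

class RNNWithPredictiveDecoder(nn.Module):
    def __init__(self, obs_dim=1, hidden=16, horizon=3):
        super().__init__()
        self.horizon = horizon
        self.rnn = nn.GRU(obs_dim, hidden, batch_first=True)
        self.task_head = nn.Linear(hidden, 1)                  # main prediction head
        self.psd_head = nn.Linear(hidden, horizon * obs_dim)   # decodes future observations

    def forward(self, y):
        # y: (batch, T, obs_dim); hidden states h: (batch, T, hidden)
        h, _ = self.rnn(y)
        return self.task_head(h), self.psd_head(h)

def training_loss(model, y, targets, psd_weight=0.5):
    """Main task loss plus the auxiliary future-prediction term on all valid time steps."""
    task_pred, future_pred = model(y)
    task_loss = ((task_pred - targets) ** 2).mean()
    B, T, D = y.shape
    k = model.horizon
    # Ground-truth future windows y_{t+1 : t+k} for t = 0 .. T-k-1
    future = torch.stack([y[:, t + 1:t + 1 + k].reshape(B, -1) for t in range(T - k)], dim=1)
    psd_loss = ((future_pred[:, :T - k] - future) ** 2).mean()
    return task_loss + psd_weight * psd_loss

# Usage on synthetic data: next-step prediction as the main task.
y = torch.sin(torch.linspace(0, 12, 64)).reshape(1, 64, 1)
model = RNNWithPredictiveDecoder()
loss = training_loss(model, y[:, :-1], y[:, 1:])
loss.backward()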


Deep Recurrent Neural Network-Based Autoencoders for Acoustic Novelty Detection

In the emerging field of acoustic novelty detection, most research efforts are devoted to probabilistic approaches such as mixture models or state-space models. Only recent studies introduced (pseudo-)generative models for acoustic novelty detection with recurrent neural networks in the form of an autoencoder. In these approaches, auditory spectral features of the next short term frame are pred...
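A minimal sketch of the mechanism described above, under assumed details (feature dimensionality, network size, thresholding rule): a recurrent network predicts the spectral features of the next frame, and frames with unusually large prediction error are flagged as novel.

# Illustrative sketch only, not the cited system.
import torch
import torch.nn as nn

class NextFramePredictor(nn.Module):
    def __init__(self, n_bins=40, hidden=64):
        super().__init__()
        self.rnn = nn.LSTM(n_bins, hidden, batch_first=True)
        self.out = nn.Linear(hidden, n_bins)

    def forward(self, frames):
        # frames: (batch, T, n_bins) spectral features; predict frame t+1 from frames up to t
        h, _ = self.rnn(frames)
        return self.out(h)

def novelty_scores(model, frames):
    """Per-frame prediction error; large values indicate acoustically novel events."""
    with torch.no_grad():
        pred = model(frames[:, :-1])                        # predictions for frames 1 .. T-1
        err = ((pred - frames[:, 1:]) ** 2).mean(dim=-1)    # (batch, T-1)
    return err

# Usage: scores above a threshold chosen on "normal" training data flag novelty.
frames = torch.randn(1, 100, 40)                            # placeholder spectral features
model = NextFramePredictor()
scores = novelty_scores(model, frames)
is_novel = scores > scores.mean() + 3 * scores.std()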


Learning Capabilities of Recurrent Neural Networks

Recurrent neural networks are important models of neural computation. This work relates their power to that of other conventional models of computation, such as Turing machines and finite automata, and proves results about their learning capabilities. Specifically, it shows that (a) probabilistic recurrent networks and probabilistic Turing machine models are equivalent, (b) probabilistic recurrent networks ...



Journal:

Volume   Issue

Pages  -

Publication year: 2018